High-dimensional covariance decomposition into sparse Markov and independence models

Authors

  • Majid Janzamin
  • Anima Anandkumar
Abstract

Fitting high-dimensional data involves a delicate tradeoff between faithful representation and the use of sparse models. Too often, sparsity assumptions on the fitted model are too restrictive to provide a faithful representation of the observed data. In this paper, we present a novel framework incorporating sparsity in different domains. We decompose the observed covariance matrix into a sparse Gaussian Markov model (with a sparse precision matrix) and a sparse independence model (with a sparse covariance matrix). Our framework incorporates sparse covariance and sparse precision estimation as special cases and thus introduces a richer class of high-dimensional models. We characterize sufficient conditions for identifiability of the two components, viz., the Markov and independence models. We propose an efficient decomposition method based on a modification of the popular ℓ1-penalized maximum-likelihood estimator (ℓ1-MLE). We establish that our estimator is consistent in both domains, i.e., it successfully recovers the supports of both the Markov and independence models, when the number of samples n scales as n = Ω(d log p), where p is the number of variables and d is the maximum node degree in the Markov model. Our experiments validate these results and also demonstrate that our models have better inference accuracy under simple algorithms such as loopy belief propagation.
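
A minimal sketch of the setup described above (the symbols J_M, Σ_R, γ, and λ are introduced here for illustration and need not match the paper's notation): the population covariance is posited to split into a Markov component with a sparse inverse plus a sparse residual covariance,

  \Sigma^* = (J_M^*)^{-1} + \Sigma_R^*,

where J_M^* is the sparse precision matrix of the Markov model and \Sigma_R^* is the sparse covariance matrix of the independence model. One illustrative way to write a modified ℓ1-penalized MLE for such a decomposition (a sketch of the idea, not necessarily the exact program analyzed in the paper) is

  \min_{J \succ 0,\; \Sigma_R} \; \operatorname{tr}\big(J(\widehat{\Sigma}^n - \Sigma_R)\big) - \log\det J + \gamma \lVert J \rVert_{1,\mathrm{off}} \quad \text{s.t.} \quad \lVert \Sigma_R \rVert_{\infty,\mathrm{off}} \le \lambda,

where \widehat{\Sigma}^n is the sample covariance; for a fixed \Sigma_R this reduces to the standard graphical lasso applied to the corrected covariance \widehat{\Sigma}^n - \Sigma_R.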


Similar articles

High-Dimensional Covariance Decomposition into Sparse Markov and Independence Domains

In this paper, we present a novel framework incorporating a combination of sparse models in different domains. We posit the observed data as generated from a linear combination of a sparse Gaussian Markov model (with a sparse precision matrix) and a sparse Gaussian independence model (with a sparse covariance matrix). We provide efficient methods for decomposition of the data into two domains, ...


Models of Random Sparse Eigenmatrices & Bayesian Analysis of Multivariate Structure

We discuss probabilistic models of random covariance structures defined by distributions over sparse eigenmatrices. The decomposition of orthogonal matrices in terms of Givens rotations defines a natural, interpretable framework for defining distributions on sparsity structure of random eigenmatrices. We explore theoretical aspects and implications for conditional independence structures arisin...


Models of Random Sparse Eigenmatrices with Application to Bayesian Factor Analysis

We discuss a new class of models for random covariance structures defined by probability distributions over sparse eigenmatrices. The decomposition of orthogonal square matrices in terms of Givens rotations defines a natural, interpretable framework for defining prior distributions over the sparsity structure of random eigenmatrices. We explore some theoretical aspects and implications for cond...
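
As a brief illustration of this construction (our notation, not necessarily the paper's): any n × n orthogonal matrix Q can be written as a product of the n(n-1)/2 planar Givens rotations and a diagonal sign matrix,

  Q = \Big[ \prod_{1 \le i < j \le n} G(i,j,\theta_{ij}) \Big] S, \qquad S = \operatorname{diag}(\pm 1),

where G(i,j,\theta) rotates the (i,j) coordinate plane by angle \theta. Setting many of the angles \theta_{ij} to zero leaves the corresponding rotations at the identity, which is one way a prior over the angles can induce sparse eigenmatrices.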


Factored sparse inverse covariance matrices

Most HMM-based speech recognition systems use Gaussian mixtures as observation probability density functions. An important goal in all such systems is to improve parsimony. One method is to adjust the type of covariance matrices used. In this work, factored sparse inverse covariance matrices are introduced. Based on UDU factorization, the inverse covariance matrix can be represented using line...
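
As a rough sketch of the factored form referred to here (our notation; the exact ordering of the factors in the paper may differ): a positive definite inverse covariance K = \Sigma^{-1} admits a modified-Cholesky factorization

  K = U D U^\top,

with U unit upper-triangular and D diagonal with positive entries. Zeroing off-diagonal entries of U then gives a sparse, structured parameterization of K that remains positive definite by construction.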


Covariance decomposition in undirected Gaussian graphical models

The covariance between two variables in a multivariate Gaussian distribution is decomposed into a sum of path weights for all paths connecting the two variables in an undirected independence graph. These weights are useful in determining which variables are important in mediating correlation between the two path endpoints. The decomposition arises in undirected Gaussian graphical models and doe...
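
One standard statement of such a path-weight expansion (written here in terms of the precision matrix \Omega = \Sigma^{-1}; this is an illustration, not a quotation of the paper) is

  \sigma_{ij} = \sum_{P \in \mathcal{P}_{ij}} (-1)^{m+1} \, \omega_{p_1 p_2} \omega_{p_2 p_3} \cdots \omega_{p_{m-1} p_m} \, \frac{\det \Omega_{\setminus P}}{\det \Omega},

where the sum runs over all paths P = (p_1, \ldots, p_m) in the independence graph connecting i = p_1 and j = p_m, and \Omega_{\setminus P} denotes the precision matrix with the rows and columns of the path's vertices removed (its determinant taken as 1 when P covers all vertices).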



Journal:
  • Journal of Machine Learning Research

Volume: 15  Issue: 

Pages: -

Publication date: 2014